Explicit Cutoff Regularization in Coordinate Representation

Authors

Abstract

In this paper, we study a special type of cutoff regularization in the coordinate representation. We show how this approach unites such concepts and properties as an explicit cut, a spectral representation, homogenization, and covariance. Besides that, we present new formulae to work with, and we give additional calculations of infrared asymptotics for some regularized Green's functions appearing in pure four-dimensional Yang–Mills theory and in the standard two-dimensional Sigma-model.
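The abstract gives no explicit formulae. As a purely illustrative sketch (not the authors' construction), the simplest coordinate-space cutoff excises short distances from the free massless propagator in four-dimensional Euclidean space; here the scale \(\Lambda\) and the step-function form are assumptions for illustration:

```latex
% Free massless scalar propagator in 4D Euclidean space
G(x-y) = \frac{1}{4\pi^2\,|x-y|^2}

% Illustrative explicit cutoff: remove the region |x-y| < 1/\Lambda
G_\Lambda(x-y) = \frac{\theta\!\left(|x-y| - 1/\Lambda\right)}{4\pi^2\,|x-y|^2}
```

In such schemes the divergences of loop integrals reappear as powers and logarithms of \(\Lambda\), which is what makes the cut "explicit."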


Similar articles

Explicit expanders with cutoff phenomena

The cutoff phenomenon describes a sharp transition in the convergence of an ergodic finite Markov chain to equilibrium. Of particular interest is understanding this convergence for the simple random walk on a bounded-degree expander graph. The first example of a family of bounded-degree graphs where the random walk exhibits cutoff in total-variation was provided only very recently, when the aut...
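The convergence in question can be illustrated numerically. The sketch below is illustrative only; the particular graph and the names `n`, `P`, and `tv_distance` are not from the paper. It computes the worst-case total-variation distance to the uniform stationary distribution for a lazy simple random walk on a small 3-regular circulant graph, a stand-in for a bounded-degree expander:

```python
import numpy as np

# Lazy simple random walk on a 3-regular circulant graph:
# vertex i is joined to i-1, i+1, and i + n/2 (all mod n).
n = 16
A = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i + 1, i + n // 2):
        A[i, j % n] = 1.0

d = 3
P = 0.5 * np.eye(n) + 0.5 * A / d   # lazy walk: stay put with probability 1/2
pi = np.full(n, 1.0 / n)            # stationary distribution is uniform

def tv_distance(t: int) -> float:
    """Worst-case total-variation distance to stationarity after t steps."""
    Pt = np.linalg.matrix_power(P, t)
    return max(0.5 * np.abs(Pt[x] - pi).sum() for x in range(n))

for t in (1, 5, 20, 50):
    print(t, round(tv_distance(t), 4))
```

For a lazy walk the transition matrix is positive semidefinite, so the distance decreases monotonically in `t`; cutoff means this decrease happens abruptly around the mixing time as the graph size grows.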


Explicit Neural Word Representation

Recent advances in word embedding provide significant benefit to various information processing tasks. Yet these dense representations and their estimation of word-to-word relatedness remain difficult to interpret and hard to analyze. As an alternative, explicit word representations propose vectors whose dimensions are easily interpretable, and recent methods show competitive performance to the...


Continuum QRPA in the coordinate space representation

We formulate a quasi-particle random phase approximation (QRPA) in the coordinate space representation. This model is a natural extension of the RPA model of Shlomo and Bertsch to open-shell nuclei in order to take into account pairing correlations together with the coupling to the continuum. We apply it to the 120Sn nucleus and show that low-lying excitation modes are significantly influenced ...


Feature Incay for Representation Regularization

Softmax loss is widely used in deep neural networks for multi-class classification, where each class is represented by a weight vector, a sample is represented as a feature vector, and the feature vector has the largest projection on the weight vector of the correct category when the model correctly classifies a sample. To ensure generalization, weight decay that shrinks the weight norm is ofte...
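The setup described above, class scores as projections of a feature vector onto per-class weight vectors, softmax cross-entropy, and a weight-decay penalty, can be sketched in a few lines. The names (`W`, `f`, `lam`) and the toy dimensions are illustrative, not the paper's method:

```python
import numpy as np

# Toy multi-class linear classifier: the score of class k is the
# projection of the feature vector f onto the weight vector W[k].
rng = np.random.default_rng(0)
num_classes, dim = 4, 8
W = rng.normal(size=(num_classes, dim))   # one weight vector per class
f = rng.normal(size=dim)                  # feature vector of one sample
y = 2                                     # ground-truth class index
lam = 1e-3                                # weight-decay coefficient

def softmax(z):
    z = z - z.max()                       # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

scores = W @ f                            # projections on each class vector
probs = softmax(scores)
ce_loss = -np.log(probs[y])               # softmax (cross-entropy) loss
wd_penalty = lam * np.sum(W * W)          # weight decay shrinks weight norms
total_loss = ce_loss + wd_penalty

# The sample is classified correctly exactly when the projection on the
# correct class's weight vector is the largest score.
correct = bool(scores.argmax() == y)
print(total_loss, correct)
```

Feature incay, as the title suggests, would add an analogous penalty on the feature norm rather than only on the weights.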


Binarized Representation Entropy (bre) Regularization

We propose a novel regularizer to improve the training of Generative Adversarial Networks (GANs). The motivation is that when the discriminator D spreads out its model capacity in the right way, the learning signals given to the generator G are more informative and diverse. These in turn help G to explore better and discover the real data manifold while avoiding large unstable jumps due to the e...



Journal

Journal title: Journal of Physics A

Year: 2022

ISSN: 1751-8113, 1751-8121

DOI: https://doi.org/10.1088/1751-8121/aca8dc